Sparse Superposition Codes: Fast and Reliable at Rates Approaching Capacity with Gaussian Noise
Authors
Abstract
For the additive white Gaussian noise channel with average codeword power constraint, sparse superposition codes are developed. Both encoding and decoding are computationally feasible. The codewords are linear combinations of subsets of vectors from a given dictionary, with the possible messages indexed by the choice of subset. An adaptive successive decoder is developed, with which communication is shown to be reliable with error probability exponentially small for all rates below the Shannon capacity.
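To make the construction concrete, here is a minimal encoding sketch, assuming the usual sparse regression code layout: a dictionary of N = L·M columns split into L sections of M columns each, with the message selecting one column per section and the codeword being the sum of the selected columns. The i.i.d. N(0, 1/n) entry scaling, the flat power allocation P/L, and the toy parameter values are illustrative conventions, not the paper's exact design.

```python
import numpy as np

def sparc_encode(msg, L, M, n, P, rng):
    """Sketch of sparse superposition (SPARC) encoding.

    The dictionary X has n rows and N = L * M columns, viewed as L sections
    of M columns each.  `msg` picks one column index per section; the
    codeword is the sum of the L selected columns, weighted so that the
    average codeword power is P.
    """
    N = L * M
    # One common convention: i.i.d. N(0, 1/n) dictionary entries.
    X = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, N))

    # Flat power allocation: every section carries power P / L.
    beta = np.zeros(N)
    beta[np.arange(L) * M + msg] = np.sqrt(n * P / L)
    return X, beta, X @ beta

# Illustrative toy parameters (not tuned to operate near capacity).
rng = np.random.default_rng(0)
L, M, n, P, sigma = 16, 32, 1024, 7.0, 1.0
msg = rng.integers(0, M, size=L)               # one column index per section
X, beta, codeword = sparc_encode(msg, L, M, n, P, rng)
y = codeword + rng.normal(0.0, sigma, size=n)  # AWGN channel output
```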
Similar works
Fast Sparse Superposition Codes have Exponentially Small Error Probability for R < C
For the additive white Gaussian noise channel with average codeword power constraint, sparse superposition codes are developed. These codes are based on the statistical high-dimensional regression framework. The paper [IEEE Trans. Inform. Theory 58 (2012), 2541–2557] investigated decoding using the optimal maximum-likelihood decoding scheme. Here a fast decoding algorithm, called adaptive successive decoding...
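The adaptive successive idea can be caricatured as iterative thresholding against a residual: score the columns of undecided sections, accept those that clear a threshold of order sqrt(2 log M), subtract them out, and repeat. The sketch below is a deliberately simplified version under the flat-power-allocation conventions of the encoder sketch above; the decoder analyzed in the paper uses more carefully constructed statistics and threshold levels, and the constant `a` and the step limit here are arbitrary illustrative choices.

```python
import numpy as np

def adaptive_successive_sketch(y, X, L, M, coef, max_steps=20, a=0.5):
    """Toy threshold-based successive decoder (simplified sketch).

    `coef` is the nonzero coefficient value used by the encoder; it is part
    of the code design, so the decoder is assumed to know it.  Each pass
    scores the columns of undecided sections against the current residual
    (scores are roughly N(0, 1) for columns that were not sent) and accepts
    the best column of a section when its score clears the threshold.
    Accepted terms are subtracted from the residual.
    """
    n = X.shape[0]
    decoded = {}                                   # section -> column index in section
    residual = y.astype(float).copy()
    tau = (1.0 + a) * np.sqrt(2.0 * np.log(M))     # illustrative threshold level

    for _ in range(max_steps):
        scores = np.sqrt(n) * (X.T @ residual) / np.linalg.norm(residual)
        progress = False
        for ell in range(L):
            if ell in decoded:
                continue
            sec = scores[ell * M:(ell + 1) * M]
            j = int(np.argmax(sec))
            if sec[j] > tau:
                decoded[ell] = j
                residual -= coef * X[:, ell * M + j]
                progress = True
        if not progress:
            break
    return decoded

# With X, y, msg from the encoder sketch above:
# est = adaptive_successive_sketch(y, X, L, M, coef=np.sqrt(n * P / L))
# section_errors = sum(est.get(ell) != msg[ell] for ell in range(L))
```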
The Error Probability of Sparse Superposition Codes with Approximate Message Passing Decoding
Sparse superposition codes, or sparse regression codes (SPARCs), are a recent class of codes for reliable communication over the AWGN channel at rates approaching the channel capacity. Approximate message passing (AMP) decoding, a computationally efficient technique for decoding SPARCs, has been proven to be asymptotically capacity-achieving for the AWGN channel. In this paper, we refine the as...
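An AMP decoder for a SPARC alternates a residual step, including an Onsager correction term, with a section-wise soft-decision (softmax) step. Below is a minimal sketch of that recursion, assuming the flat power allocation and N(0, 1/n) dictionary of the earlier sketches, and estimating the noise-plus-interference level tau_t^2 empirically as ||z_t||^2 / n rather than taking it from state evolution; the iteration count is arbitrary.

```python
import numpy as np

def amp_decode(y, X, L, M, P, n_iters=25):
    """Sketch of AMP decoding for a SPARC with flat power allocation.

    Each iteration forms a residual z with an Onsager correction, builds the
    effective observation s = beta + X^T z, and applies a section-wise
    softmax denoiser that concentrates the section's power on the most
    plausible column.  tau_t^2 is estimated as ||z||^2 / n.
    """
    n, N = X.shape
    c = np.sqrt(n * P / L)            # nonzero coefficient value, flat allocation
    beta = np.zeros(N)
    z = y.astype(float).copy()
    tau2_prev = None

    for _ in range(n_iters):
        # Residual step with Onsager correction (skipped on the first pass).
        if tau2_prev is not None:
            z = y - X @ beta + (z / tau2_prev) * (P - beta @ beta / n)
        tau2 = z @ z / n
        s = beta + X.T @ z

        # Section-wise softmax denoiser (numerically stabilized).
        u = (s * c / tau2).reshape(L, M)
        u -= u.max(axis=1, keepdims=True)
        w = np.exp(u)
        beta = (c * w / w.sum(axis=1, keepdims=True)).reshape(N)
        tau2_prev = tau2

    # Hard decision: the largest coefficient in each section.
    return beta.reshape(L, M).argmax(axis=1)

# With X, y, msg from the encoder sketch above:
# est = amp_decode(y, X, L, M, P)
# section_errors = int(np.sum(est != msg))
```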
Rateless codes approaching capacity of the Gaussian channel
We study codes for the AWGN channel. We build upon the recently proposed sparse superposition codes of Barron and Joseph, which for any constant backoff from capacity can achieve exponential error probability with polynomial decoding time as long as the block-length is greater than a fixed constant (which depends on the backoff). Our contribution is twofold: • First, we give a variant of the co...
An Improved Analysis of Least Squares Superposition Codes with Bernoulli Dictionary
For the additive white Gaussian noise channel with average power constraint, sparse superposition codes, proposed by Barron and Joseph in 2010, achieve the capacity. While the codewords of the original sparse superposition codes are made with a dictionary matrix drawn from a Gaussian distribution, we consider the case that it is drawn from a Bernoulli distribution. We show an improved upper bou...
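Relative to the earlier sketches, the only change described here is the distribution of the dictionary entries. A minimal illustration of the two choices, scaled to matching variance (the 1/n scaling is the convention of the sketches above, not necessarily the paper's):

```python
import numpy as np

def gaussian_dictionary(n, N, rng):
    # i.i.d. N(0, 1/n) entries, as in the encoder sketch above.
    return rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, N))

def bernoulli_dictionary(n, N, rng):
    # i.i.d. entries uniform on {-1/sqrt(n), +1/sqrt(n)}: same mean and
    # variance as the Gaussian entries, but bounded and cheap to store.
    return rng.choice([-1.0, 1.0], size=(n, N)) / np.sqrt(n)

# Either matrix can be plugged into the encoding/decoding sketches above in
# place of the Gaussian dictionary.
```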
Universal Sparse Superposition Codes with Spatial Coupling and GAMP Decoding
Sparse superposition codes, or sparse regression codes, constitute a new class of codes which was first introduced for communication over the additive white Gaussian noise (AWGN) channel. It has been shown that such codes are capacity-achieving over the AWGN channel under optimal maximum-likelihood decoding as well as under various efficient iterative decoding schemes equipped with power alloca...
Publication date: 2011